Convergence analysis of a block preconditioned steepest descent eigensolver with implicit deflation

Abstract

Gradient-type iterative methods for solving Hermitian eigenvalue problems can be accelerated by using preconditioning and deflation techniques. A preconditioned steepest descent iteration with implicit deflation (PSD-id) is one such method. The convergence behavior of the PSD-id was recently investigated based on the pioneering work of Samokish on the preconditioned steepest descent method (PSD). The resulting non-asymptotic estimates indicate a superlinear convergence of the PSD-id under strong assumptions on the initial guess. The present paper utilizes an alternative convergence analysis of the PSD by Neymeyr under much weaker assumptions. We embed Neymeyr's approach into the analysis of the PSD-id via a restricted formulation of the PSD-id. More importantly, we extend the new analysis to the practically preferred block version of the PSD-id, or BPSD-id, and show the cluster robustness of the BPSD-id. Numerical examples are provided to validate the theoretical estimates.
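As a rough illustration of the kind of block iteration analyzed in the paper, the sketch below performs a Rayleigh-Ritz step over the span of the current block and its preconditioned residuals, keeping iterates orthogonal to already-converged eigenvectors (implicit deflation). The function name `bpsd_id`, its parameters, and the preconditioner choice in the usage example are illustrative assumptions, not the paper's algorithm.

```python
import numpy as np

def bpsd_id(A, X, precond, Q=None, tol=1e-8, maxit=200):
    """Sketch of a block preconditioned steepest descent iteration with
    implicit deflation (names and defaults are illustrative assumptions).

    A       : Hermitian matrix (n x n)
    X       : initial block of p approximate eigenvectors (n x p)
    precond : callable applying an approximation of A^{-1} to a block
    Q       : optional n x m block of already-converged eigenvectors;
              iterates are kept orthogonal to span(Q) (implicit deflation)
    """
    n, p = X.shape
    if Q is None:
        Q = np.zeros((n, 0))
    X = X - Q @ (Q.conj().T @ X)              # deflate the initial block
    X, _ = np.linalg.qr(X)
    for _ in range(maxit):
        AX = A @ X
        Theta = X.conj().T @ AX               # block Rayleigh quotient
        R = AX - X @ Theta                    # block residual
        if np.linalg.norm(R) < tol:
            break
        W = precond(R)                        # preconditioned residuals
        W = W - Q @ (Q.conj().T @ W)          # implicit deflation
        S, _ = np.linalg.qr(np.hstack([X, W]))
        H = S.conj().T @ (A @ S)              # Rayleigh-Ritz projection
        _, V = np.linalg.eigh(H)
        X = S @ V[:, :p]                      # keep p smallest Ritz vectors
    theta = np.linalg.eigvalsh(X.conj().T @ (A @ X))
    return theta, X
```

For a diagonal test matrix with an exact-inverse preconditioner, `bpsd_id(A, X0, lambda R: np.linalg.solve(A, R))` recovers the p smallest eigenpairs in a handful of iterations; a weaker preconditioner would exhibit the linear rates that the convergence estimates quantify.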

Related articles

A Geometric Convergence Theory for the Preconditioned Steepest Descent Iteration

Preconditioned gradient iterations for very large eigenvalue problems are efficient solvers with growing popularity. However, only for the simplest preconditioned eigensolver, namely the preconditioned gradient iteration (or preconditioned inverse iteration) with fixed step size, sharp non-asymptotic convergence estimates are known. These estimates require a properly scaled preconditioner. In t...
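The fixed-step-size iteration referred to above can be sketched in a few lines; `pinvit` and its arguments are illustrative assumptions, with `precond` standing in for the properly scaled preconditioner approximating A^{-1}.

```python
import numpy as np

def pinvit(A, x, precond, tol=1e-10, maxit=500):
    """Sketch of preconditioned inverse iteration (PINVIT): the fixed-step
    preconditioned gradient iteration x <- x - B^{-1}(A x - lambda(x) x),
    where `precond` applies B^{-1}, a scaled approximation of A^{-1}."""
    x = x / np.linalg.norm(x)
    for _ in range(maxit):
        lam = x @ (A @ x)              # Rayleigh quotient lambda(x)
        r = A @ x - lam * x            # eigenvalue residual
        if np.linalg.norm(r) < tol:
            break
        x = x - precond(r)             # fixed step size (omega = 1)
        x = x / np.linalg.norm(x)
    return x @ (A @ x), x
```

With the exact inverse as preconditioner the update reduces to inverse iteration, so the iterate converges to the smallest eigenpair; the sharp non-asymptotic estimates mentioned above bound the rate for inexact, properly scaled preconditioners.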


Convergence analysis of a locally accelerated preconditioned steepest descent method for Hermitian-definite generalized eigenvalue problems

By extending the classical analysis techniques due to Samokish, Faddeev and Faddeeva, and Longsine and McCormick among others, we prove the convergence of preconditioned steepest descent with implicit deflation (PSD-id) method for solving Hermitian-definite generalized eigenvalue problems. Furthermore, we derive a nonasymptotic estimate of the rate of convergence of the PSD-id method. We show t...


The Block Preconditioned Steepest Descent Iteration for Elliptic Operator Eigenvalue Problems

The block preconditioned steepest descent iteration is an iterative eigensolver for subspace eigenvalue and eigenvector computations. An important area of application of the method is the approximate solution of mesh eigenproblems for self-adjoint and elliptic partial differential operators. The subspace iteration allows the computation of some of the smallest eigenvalues together with the associated i...


Strong and Weak Convergence Theorems of Implicit Hybrid Steepest-descent Methods for Variational Inequalities

Assume that F is a nonlinear operator on a real Hilbert space H which is strongly monotone and Lipschitzian with constants η > 0 and κ > 0, respectively, on a nonempty closed convex subset C of H. Assume also that C is the intersection of the fixed point sets of a finite number of nonexpansive mappings on H. We develop an implicit hybrid steepest-descent method which generates an iterative seq...


Toward the Optimal Preconditioned Eigensolver: Locally Optimal Block Preconditioned Conjugate Gradient Method

Numerical solution of extremely large and ill-conditioned eigenvalue problems has been attracting growing attention recently, as such problems are of major importance in applications. They arise typically as discretizations of continuous models described by systems of partial differential equations (PDEs). For such problems, preconditioned matrix-free eigensolvers are especially effective as the sti...
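The LOBPCG method summarized above is available in SciPy as `scipy.sparse.linalg.lobpcg`; the following is a minimal usage sketch in which the diagonal test matrix and the diagonal preconditioner `M` are illustrative choices, not part of the cited work.

```python
import numpy as np
from scipy.sparse import diags
from scipy.sparse.linalg import lobpcg

n = 1000
d = np.arange(1, n + 1, dtype=float)
A = diags([d], [0])                      # SPD test matrix with known spectrum
M = diags([1.0 / d], [0])                # preconditioner approximating A^{-1}
rng = np.random.default_rng(0)
X = rng.standard_normal((n, 4))          # random initial block of 4 vectors

# Compute the 4 smallest eigenpairs; A and M may also be matrix-free
# LinearOperators, which is the typical setting for PDE discretizations.
vals, vecs = lobpcg(A, X, M=M, largest=False, tol=1e-8, maxiter=100)
```

With a good preconditioner the iteration count is essentially independent of n, which is the practical appeal of preconditioned matrix-free eigensolvers for mesh eigenproblems.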



Journal

Journal title: Numerical Linear Algebra With Applications

Year: 2023

ISSN: 1070-5325, 1099-1506

DOI: https://doi.org/10.1002/nla.2498